Regularization versus Early Stopping: a Case Study with a Real System
Authors
Abstract
Regularization and Early Stopping are two of the most common techniques for dealing with the overtraining problem in the field of Artificial Neural Networks. Overtraining appears mostly in systems affected by noise: after a certain amount of training, the neural network used for modelling starts to learn information specific to the training signal or to the noise. It has already been shown that these techniques can avoid this problem and that they are formally equivalent, but the issue deserves further investigation, since real systems sometimes behave differently from simulated ones. A fair comparison between the two techniques is not easy to make, because the networks contain several parameters that cannot be determined analytically. To overcome this difficulty, the present work uses a procedure for automating the construction of the models. This procedure creates models that are optimised in the number of inputs, the number of hidden neurons and the generalization capability, using either Early Stopping or Regularization, which makes a fair comparison possible. The procedure includes a hybrid direct/specialized training solution for evaluating the inverse model. The system used to test the results is a reduced-scale prototype kiln affected by measurement noise, for which the Direct Inverse Control and Internal Model Control strategies were implemented.
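The two techniques the abstract compares can be illustrated on a toy regression problem. The sketch below is not the paper's kiln data or its automated model-construction procedure; it is a minimal numpy example, with a hypothetical noisy sine signal, that contrasts early stopping (gradient descent halted at the best validation error) with explicit L2 regularization (closed-form ridge) on the same training split.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a noisy system (the paper uses a prototype kiln):
# a sine wave corrupted by measurement noise.
x = np.linspace(-1.0, 1.0, 60)
y = np.sin(np.pi * x) + 0.3 * rng.standard_normal(x.size)

# High-degree polynomial features give the model enough capacity to overtrain.
X = np.vander(x, 12)
perm = rng.permutation(x.size)
train, val = perm[:40], perm[40:]

def mse(w, idx):
    return float(np.mean((X[idx] @ w - y[idx]) ** 2))

# Early stopping: plain gradient descent on the training error, keeping the
# weights with the lowest validation error seen so far.
w = np.zeros(X.shape[1])
best_w, best_val = w.copy(), np.inf
for _ in range(5000):
    grad = 2.0 * X[train].T @ (X[train] @ w - y[train]) / train.size
    w -= 0.05 * grad
    v = mse(w, val)
    if v < best_val:
        best_val, best_w = v, w.copy()

# Regularization: closed-form ridge (L2 penalty) on the same training split.
lam = 1e-2
w_ridge = np.linalg.solve(
    X[train].T @ X[train] + lam * np.eye(X.shape[1]),
    X[train].T @ y[train],
)

print(f"early stopping val MSE: {best_val:.3f}")
print(f"ridge          val MSE: {mse(w_ridge, val):.3f}")
```

Both runs use the same features and the same split, so the only difference is how capacity is controlled: by when training stops, or by the penalty weight `lam`.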
Similar resources
Ho-Kashyap with Early Stopping Versus Soft Margin SVM for Linear Classifiers - An Application
In a classification problem, hard-margin SVMs tend to minimize the generalization error by maximizing the margin. Regularization is obtained with soft-margin SVMs, which improve performance by relaxing the constraints on margin maximization. This article shows that comparable performance can be obtained in the linearly separable case with the Ho–Kashyap learning rule associated with early st...
Flash Flood Forecasting by Statistical Learning in the Absence of Rainfall Forecast: A Case Study
The feasibility of flash flood forecasting without making use of rainfall predictions is investigated. After a presentation of the “Cévenol flash floods”, which caused 1.2 billion euros of economic damage and 22 fatalities in 2002, the difficulties incurred in forecasting such events are analyzed, with emphasis on the nature of the database and the origins of measurement noise. The hi...
Regularization by Early Stopping in Single Layer Perceptron Training
Adaptive training of the non-linear single-layer perceptron can lead to the Euclidean distance classifier and later to the standard Fisher linear discriminant function. On the way between these two classifiers one obtains a regularized discriminant analysis, which is equivalent to adding a “weight decay” regularization term to the cost function. Thus early stopping plays the role of regularization ...
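The “weight decay” term mentioned in this snippet can be made concrete: adding λ‖w‖² to the cost contributes a 2λw term to the gradient, which shrinks the weights at every step. A minimal sketch, with hypothetical two-class toy data and a logistic single-layer perceptron (not the paper's exact setup), comparing training with and without the decay term:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-class data for a single-layer (logistic) perceptron.
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)), rng.normal(1.0, 1.0, (50, 2))])
t = np.repeat([0.0, 1.0], 50)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(lam, steps=2000, lr=0.1):
    """Gradient descent on cross-entropy + lam * ||w||^2 (weight decay)."""
    w, b = np.zeros(2), 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)
        # The "+ 2*lam*w" term is exactly the weight-decay contribution of
        # the L2 penalty; with lam=0 this is plain unregularized training.
        w -= lr * (X.T @ (p - t) / t.size + 2.0 * lam * w)
        b -= lr * np.mean(p - t)
    return w

w_plain = train(lam=0.0)
w_decay = train(lam=0.1)
# Decay keeps the weight vector shorter (a smoother decision boundary),
# the same effect early stopping achieves by halting before the weights grow.
print(np.linalg.norm(w_plain), np.linalg.norm(w_decay))
```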
Don't relax: early stopping for convex regularization
We consider the problem of designing efficient regularization algorithms when regularization is encoded by a (strongly) convex functional. Unlike classical penalization methods based on a relaxation approach, we propose an iterative method where regularization is achieved via early stopping. Our results show that the proposed procedure achieves the same recovery accuracy as penalization methods...
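The idea behind this snippet, the iteration count acting as the regularization parameter, can be sketched on a hypothetical ill-conditioned least-squares problem (not the setting of the cited paper) using plain gradient descent, i.e. Landweber iteration, started from zero:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical ill-conditioned problem A w ≈ y: symmetric matrix whose
# spectrum decays over six orders of magnitude, plus small noise on y.
n = 30
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(np.logspace(0, -6, n)) @ U.T
y = A @ np.ones(n) + 1e-4 * rng.standard_normal(n)

def landweber(k, eta):
    """k steps of gradient descent from zero on 0.5 * ||A w - y||^2."""
    w = np.zeros(n)
    for _ in range(k):
        w -= eta * A.T @ (A @ w - y)
    return w

eta = 1.0 / np.linalg.norm(A, 2) ** 2  # step size below 2 / sigma_max^2

# The iteration count plays the role of the regularization parameter:
# few steps keep the solution norm small (strong regularization), many
# steps drive down the residual but inflate the norm by fitting the noise.
norms = [np.linalg.norm(landweber(k, eta)) for k in (10, 300, 30000)]
print(norms)
```

Stopping early therefore selects a point on the same bias-variance trade-off that a penalization method reaches by tuning its penalty weight.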
Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation
In this paper a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...
Publication date: 2003